📫 E-mail : [email protected]
📥 Github : 🖱Github
💻 Velog : 📎Velog
📞 Phone : 010-4909-4946
LAB. Xfact Lab
2024.03 -
Continuation of "Training Language Models With Pause Tokens"(ICLR 2024, Google Research)
This ongoing research adopts a methodology where, during the generation of the K+1 token, not only the previous K steps are utilized but also incorporates dummy [PAUSE] tokens, increasing the computation steps to K+M.
Unlike the original paper’s approach of appending 10 [PAUSE] tokens to the suffix of the prompt, this study focuses on analyzing the effect of noise token with measuring likelihood and attention flows in reasoning task.
Moreover, this research employs fine-tuning instead of pre-training utilized in previous studies, aiming to enhance performance and efficiency.
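A minimal sketch of the pause-token setup described above, assuming a Hugging Face-style tokenizer and causal LM; the checkpoint name, the [PAUSE] special-token registration, and the prompt are illustrative placeholders, not the actual experimental code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"   # placeholder checkpoint, not the model used in the study
NUM_PAUSE = 10        # M dummy tokens appended after the prompt

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Register [PAUSE] as a new special token and grow the embedding table to match.
tokenizer.add_special_tokens({"additional_special_tokens": ["[PAUSE]"]})
model.resize_token_embeddings(len(tokenizer))

prompt = "Q: A train travels 60 km in 1.5 hours. What is its speed? A:"
# Append M [PAUSE] tokens so the model gets K + M computation steps
# before it has to emit the (K+1)-th answer token.
padded_prompt = prompt + " " + " ".join(["[PAUSE]"] * NUM_PAUSE)

inputs = tokenizer(padded_prompt, return_tensors="pt")
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=32)
answer = tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer)
```

In the actual study, such padded inputs would be used for fine-tuning and for inspecting per-token likelihoods and attention flows (e.g. via output_attentions=True), rather than only for generation.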
2017.03 - 2024.02
GPA. 4.4 / 4.5 (98.9/100)
Major GPA. 4.35 / 4.5
Scholarship.
Received Korea University’s Best Student Award every semester.
Research Intern
Developing the idea of the <Learning from Self-Sampled Correct and Partially-Correct Programs> paper and generalizing it to general-purpose code.
Conducting research on introducing blocking techniques to derive intermediate variable states from general-purpose code and using those states as a verification tool for training the model.
Among the incorrect programs generated by the model, those that produce the correct intermediate variable state are treated as evidence of partial correctness and reused for resampling (see the sketch below).
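A minimal sketch of this partial-correctness filter, assuming generated programs are plain Python and that the watched variable name and its expected intermediate state are known per problem; the helpers below are illustrative, not the lab's actual code:

```python
def run_and_capture(program: str, watched_var: str):
    """Execute a generated program and return the value of one intermediate variable."""
    namespace: dict = {}
    try:
        exec(program, namespace)  # untrusted code: a real setup would sandbox this
    except Exception:
        return None
    return namespace.get(watched_var)

def select_partially_correct(candidates, watched_var, expected_state):
    """Among incorrect candidate programs, keep those whose intermediate
    variable state matches the expected state (evidence of partial correctness)."""
    return [
        program for program in candidates
        if run_and_capture(program, watched_var) == expected_state
    ]

# Example: two sampled programs; only the first reaches the right intermediate 'total'.
samples = [
    "total = sum([1, 2, 3])\nanswer = total * 10",  # wrong final answer, correct 'total'
    "total = 7\nanswer = total * 2",                # wrong 'total'
]
reusable = select_partially_correct(samples, watched_var="total", expected_state=6)
print(len(reusable))  # -> 1; these samples would be fed back for resampling
```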
Language Lab Text Analytics Squad
Internship
Classification of consultation types into 127 categories.
Classification of VOC (Voice of Customer) types into 320 categories based on STT transcripts.
5-level sentiment analysis.
Extracted noun phrases with a generation model and used them as prompts for NER (see the sketch after this list).
Experiments with a Korean-version PLM of <From Clozing to Comprehending: Retrofitting Pre-trained Language Model to Pre-trained Machine Reader>.
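A minimal sketch of the noun-phrase-prompted NER step above, assuming the noun phrases have already been extracted by the generation model; the prompt format, label set, and example sentence are illustrative, not the squad's actual pipeline:

```python
def build_ner_prompt(sentence: str, noun_phrases: list[str]) -> str:
    """Ask the generation model to assign an entity type to each extracted noun phrase."""
    phrase_list = "\n".join(f"- {np}" for np in noun_phrases)
    return (
        "Sentence: " + sentence + "\n"
        "Candidate noun phrases:\n" + phrase_list + "\n"
        "For each phrase, answer with its entity type (PERSON, ORG, PRODUCT, DATE, or O):"
    )

sentence = "The customer asked about the Star Credit Card annual fee on March 3rd."
noun_phrases = ["the customer", "the Star Credit Card annual fee", "March 3rd"]
prompt = build_ner_prompt(sentence, noun_phrases)
print(prompt)  # this prompt would then be passed to the generation model for NER
```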
2023.02 - 2023.06
Participated in the Google Solution Challenge as a core member.
NLP Track
2022.03 - 2022.06
Learned the essentials of working as an ML engineer.
Participated in 3 NLP competitions and projects, and 1 data production project.
Manager and Mentor
2022.06 - 2022.12
Led a paper review study on Computer Vision and participated in DACON competitions as a team.
Manager and Mentor
2021.02 - 2023.06